2024-07-02 11:10:12 · AIbase
A training method for medical expert models, proposed by Peking University and collaborators, has elevated an 8B model to GPT-4-level performance.
A team from Peking University and the Hong Kong University of Science and Technology has made a splash with a new training method: an 8B-parameter medical expert model that reaches GPT-4-level performance. That is no small feat. Along the way, they also introduced a new concept, the "stability gap," to explain certain phenomena observed during the continual pre-training of large language models.